
Biden and Trump supporters see two different Facebooks, and here’s proof

‘The Markup’ studied the newsfeeds of 2,601 volunteers to better understand how Facebook’s algorithm serves different news to different groups.

[Source photo: Maxim Ilyahov/Unsplash]

By Mark Sullivan | 2 minute read

The nonprofit news organization The Markup launched a new tool on Thursday that compares side by side the Facebook “filter bubbles” of Biden supporters versus Trump supporters. They are two very different worlds.

The new tool, called Split Screen, is one of the first fruits of The Markup’s Citizen Browser Project, in which the group paid 2,601 people to report—via a special browser—the unique mix of content Facebook’s algorithm serves them based on their demographics and political leanings, among many other factors.

The tool can compare the news posts, group recommendations, and hashtags that are likely to be suggested to Biden voters (on the left) versus Trump voters (on the right).

For the Biden crowd, Facebook was more likely to show content from NPR, The New York Times, NBC News, and The Washington Post. It was far more likely to serve the Trump crowd articles from The Daily Wire and Fox News, and somewhat more likely to serve them articles from CNSNews.com and Newsmax.

According to Split Screen, Biden voters were far more likely to see recommendations for groups about Star Trek memes compared to Trump voters. The algorithm was modestly more likely to suggest wholesome comedy groups to Trump supporters.

The hashtags served to the Trump and Biden crowds during the last two weeks were different, but their relative popularity wasn’t heavily influenced by partisanship.


(Note that the sample of Biden supporters is much larger than the sample of Trump supporters. The Markup says it had difficulty finding people who identified as Trump supporters—perhaps for the same reasons that pollsters had trouble reaching them before the election.)

Facebook’s newsfeed algorithm is designed to show you news posts it thinks you’ll like and agree with, so that you’ll keep scrolling and viewing ads for as long as possible. That means you’ll likely see a very different set of news articles in your feed than someone with political views different than your own. That’s commonly referred to as a “filter bubble,” a phenomenon that’s been hard to study because the algorithms that create them are hidden deep within a black box at Facebook.

Facebook disputes the idea that it exacerbates hyperpartisanship by creating filter bubbles, saying that on the contrary it exposes people to a larger variety of news sources than ever before.

It’s important to note that a user’s political persuasion isn’t the only factor considered by Facebook’s algorithm when choosing content. A Daily Wire article could show up in a user’s newsfeed just because one of their right-leaning friends shared it with them, not because of any deep thinking by the algorithm. The Markup’s researchers acknowledge that their tool looks for differences in relative popularity by political group regardless of the reason the content was served.
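To make that “relative popularity” idea concrete, here is a minimal, hypothetical sketch in Python—with made-up data, and not The Markup’s actual code or methodology—of how one might compare how often a news source appears in each panel’s feeds, regardless of why it was served:

```python
# Hypothetical illustration of a "relative popularity" comparison.
# All data below is invented for demonstration purposes.
from collections import Counter

# Each entry is the domain of one news post observed in a panelist's feed.
biden_panel_posts = ["nytimes.com", "npr.org", "foxnews.com", "npr.org"]
trump_panel_posts = ["foxnews.com", "dailywire.com", "foxnews.com", "nytimes.com"]

def relative_popularity(posts, domain):
    """Share of observed posts that came from the given domain."""
    counts = Counter(posts)
    return counts[domain] / len(posts) if posts else 0.0

for domain in ["foxnews.com", "npr.org"]:
    biden_share = relative_popularity(biden_panel_posts, domain)
    trump_share = relative_popularity(trump_panel_posts, domain)
    # The comparison only measures how often a source shows up in each
    # group's feeds, not the reason Facebook chose to serve it.
    print(f"{domain}: Biden panel {biden_share:.0%} vs. Trump panel {trump_share:.0%}")
```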

Split Screen doesn’t just do politics. It can also compare the newsfeeds of men vs. women, and Millennials vs. Baby Boomers. And you can compare newsfeeds from any one-, two-, or four-week period going back to December 1, 2020.


ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.
